4 research outputs found

    Vision-based automatic landing of a rotary UAV

    A hybrid-like (continuous and discrete-event) approach to controlling a small multi-rotor unmanned aerial system (UAS) while landing on a moving platform is described. The landing scheme is based on positioning visual markers on a landing platform in a detectable pattern. After the onboard camera detects the marker pattern, the inner control algorithm sends vision-based servo commands to align the multi-rotor with the targets. This method is less computationally complex because it uses color-based object detection applied to a geometric pattern instead of feature-tracking algorithms, and it has the advantage of not requiring the distance to the objects to be calculated. The continuous approach accounts for the UAV and the platform rolling, pitching, and yawing, which is essential for real-time landing on a moving target such as a ship. A discrete-event supervisor working in parallel with the inner controller is designed to assist the automatic landing of a multi-rotor UAV on a moving target. This supervisory control strategy allows the pilot and crew to make time-critical decisions when exceptions occur, such as losing the targets from the field of view. The developed supervisor improves both the low-level vision-based auto-landing system and the high-level human-machine interface. The proposed hybrid-like approach was tested in simulation using a quadcopter model in the Virtual Robot Experimentation Platform (V-REP) working in parallel with the Robot Operating System (ROS). Finally, the method was validated in a series of real-time experiments with indoor and outdoor quadcopters landing on both static and moving platforms. The developed prototype system demonstrated the capability of landing within 25 cm of the desired touchdown point. The auto-landing system is small (100 × 100 mm), lightweight (100 g), and consumes little power (under 2 W).
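The color-based alignment step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the marker color bounds, the gain, and the function names are hypothetical, and a real system would feed the set-points to a flight controller rather than return them. Note that no depth estimate appears anywhere, matching the abstract's claim.

```python
import numpy as np

def marker_centroid(img, lo, hi):
    """Centroid (x, y) of pixels whose RGB values fall inside [lo, hi]."""
    mask = np.all((img >= lo) & (img <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None          # marker lost: hand control to the supervisor
    return xs.mean(), ys.mean()

def servo_command(img, lo, hi, gain=0.002):
    """Map pixel error (marker centroid vs. image centre) to lateral
    velocity set-points; the distance to the marker is never computed."""
    c = marker_centroid(img, lo, hi)
    if c is None:
        return None
    h, w = img.shape[:2]
    ex, ey = c[0] - w / 2, c[1] - h / 2
    return -gain * ex, -gain * ey   # (vx, vy) commands

# Synthetic 100x100 frame with a red 10x10 marker centred at (70, 40).
img = np.zeros((100, 100, 3), np.uint8)
img[35:45, 65:75] = (255, 0, 0)
cmd = servo_command(img, lo=(200, 0, 0), hi=(255, 60, 60))
```

A production system would threshold in HSV rather than RGB for lighting robustness, and would detect the full geometric pattern (several markers) rather than a single blob, but the control law has the same shape.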

    Visual servoing for autonomous landing of a multi-rotor UAS on a moving platform

    In this paper, a method to control a small multi-rotor unmanned aerial system (UAS) while landing on a moving platform using image-based visual servoing (IBVS) is described. The landing scheme is based on positioning visual markers on a landing platform in the form of a detectable pattern. When the onboard camera detects the marker pattern, the flight control algorithm sends vision-based servo commands to align the multi-rotor with the targets. The main contribution is that the proposed method is less computationally expensive because it uses color-based object detection applied to a geometric pattern instead of feature-tracking algorithms. The method also has the advantage of not requiring the distance to the objects (depth) to be calculated. The proposed method was tested in simulation using a quadcopter model in the Virtual Robot Experimentation Platform (V-REP) working in parallel with the Robot Operating System (ROS). Finally, the method was validated in a series of real-time experiments with a quadcopter.

    A comparison of two novel approaches for conducting detect and avoid flight test

    This paper compares two approaches developed by the National Research Council of Canada to conduct ‘near-miss’ intercepts in flight test, and describes a new method for assessing the efficacy of these trajectories. Each approach used a different combination of flight test techniques and displays to provide guidance to the pilots to set up the aircraft on a collision trajectory and to maintain the desired path. Approach 1 provided only visual guidance of the relative azimuth and position of the aircraft, whereas Approach 2 established the conflict point (latitude/longitude) from the desired geometry, and provided cross-track error from the desired intercept as well as speed cueing for the arrival time. The performance of the approaches was analyzed by comparing the proportion of time during which the predicted closest approach distance was at or below a desired threshold value. The analysis showed that Approach 2 more than doubled the time spent at or below the desired closest approach distance across all azimuths flown. Moreover, because less time was required to establish the initial conditions and to stabilize the flight paths, the authors were able to conduct 50% more intercepts.
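The "predicted closest approach distance" metric can be computed from the standard closest-point-of-approach (CPA) geometry for two aircraft assumed to fly straight lines at constant velocity. The paper does not give its exact formulation, so the sketch below is an assumption; the function name and the example geometry are hypothetical.

```python
import numpy as np

def closest_approach(p1, v1, p2, v2):
    """Time to and distance at the closest point of approach for two
    aircraft on straight, constant-velocity tracks.
    p* are positions (m), v* velocities (m/s), as same-length vectors."""
    r = np.asarray(p2, float) - np.asarray(p1, float)   # relative position
    v = np.asarray(v2, float) - np.asarray(v1, float)   # relative velocity
    vv = float(np.dot(v, v))
    # Diverging or co-moving traffic: the closest approach is now (t = 0).
    t_star = 0.0 if vv == 0.0 else max(0.0, -float(np.dot(r, v)) / vv)
    d_min = float(np.linalg.norm(r + t_star * v))
    return t_star, d_min

# Head-on geometry: 2000 m apart with a 50 m lateral offset,
# closing at a relative speed of 100 m/s.
t, d = closest_approach([0, 0], [50, 0], [2000, 50], [-50, 0])
# t = 20.0 s, d = 50.0 m
```

Sampling this prediction along the recorded flight paths, and counting the fraction of samples with `d_min` at or below the threshold, reproduces the kind of proportion-of-time comparison the paper describes.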

    Development of a Novel Implementation of a Remotely Piloted Aircraft System over 25 kg for Hyperspectral Payloads

    A main factor limiting the operation of low-altitude remotely piloted aircraft systems (RPAS) over 25 kg with integrated pushbroom hyperspectral sensors is the combination of aircraft-performance challenges (e.g., flight time) and regulatory requirements that deter users from pushing beyond this weight limit. In this study, we showcase a novel implementation using the DJI Agras T30 as an aerial system for integrating an advanced hyperspectral imager (HSI, HySpex VS-620). We present the design and fabrication approach applied to integrate the HSI payload, the key considerations for powering the HSI and its gimbal, and the results of vibration and wind-tunnel tests. We also evaluate the system's flight capacity and the geometric and radiometric quality of the HSI data. The final weight of the T30 after integration of the HSI payload and ancillary hardware was 43 kg. Our vibration test showed that the vibration isolator and the gimbal attenuated vibration transmission above 15 Hz but also introduced a resonant peak at 9.6 Hz that amplified low-frequency vibration (on the order of an RMS of ~0.08 g). The wind-tunnel test revealed that the system is stable up to nearly twice the manufacturer's rated wind speed (i.e., 8 m/s). Given the Canadian Special Flight Operations Certificate requirement (RPAS > 25 kg) to land with a battery level of ≥30%, the system was able to cover an area of ~2.25 ha at a speed of 3.7 m/s and an altitude of 100 m above ground level (AGL) in 7 min. The results with the HSI payload at speeds and altitudes from 50 m to 100 m AGL show hyperspectral imagery with minimal roll–pitch–yaw artefacts prior to geocorrection and spectra consistent with nominal reflectance targets. Finally, we discuss the steps followed to deal with the continuously evolving regulatory framework developed by Transport Canada for systems > 25 kg. Our work advances low-altitude HSI applications and encourages remote sensing scientists to take advantage of national regulatory frameworks, which ultimately improve the overall quality of HSI data and the safety of operations with RPAS > 25 kg.
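The reported coverage figures (≈2.25 ha at 3.7 m/s in 7 min) can be related through a simple lawnmower-survey model: total track length equals area divided by effective swath, flown at constant ground speed, ignoring turns. The sketch below is a rough sanity check, not the authors' mission planner; the effective swath is back-calculated from the abstract's numbers and the helper name is hypothetical.

```python
def coverage_time_s(area_m2, swath_m, speed_mps, overlap=0.0):
    """Flight time for a lawnmower survey: (area / effective swath)
    gives total track length, divided by ground speed (turns ignored)."""
    effective_swath = swath_m * (1.0 - overlap)
    return (area_m2 / effective_swath) / speed_mps

# Back-calculate the effective swath implied by the abstract:
# 2.25 ha = 22,500 m^2 covered in 7 min (420 s) at 3.7 m/s.
swath = 22_500 / (3.7 * 420)      # ~14.5 m effective swath (assumption)
t = coverage_time_s(22_500, swath, 3.7)
# t = 420.0 s, i.e. 7 min, consistent with the reported figure
```

With sidelap for image mosaicking (`overlap` > 0), the same area takes proportionally longer, which is one reason the ≥30% battery-reserve requirement constrains surveyable area at a given altitude.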